feat: gate optimization by dof multiplier #116
Conversation
Summary of Changes

Hello @AlexanderPietsch, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed. This pull request enhances the robustness of parameter optimization by introducing a data sufficiency check. The primary goal is to prevent models from overfitting when trained on insufficient data, particularly in scenarios with many parameters. By dynamically adjusting the minimum required data based on the number of parameters and a configurable multiplier, the system ensures that optimization only proceeds when there's a reasonable amount of data to support meaningful results, thereby improving the reliability of optimized parameters.

Highlights
Changelog
Activity
Cursor Bugbot has reviewed your changes and found 1 potential issue.
```python
        strategy=strat.name,
        search_method=search_method,
    )
    search_space = {}
```
Search space parameters silently dropped when gate triggers
High Severity
When the data-sufficiency gate triggers and clears `search_space = {}`, parameters that had multiple options are silently lost. They are never transferred into `fixed_params`, so the subsequent `evaluate({})` call constructs `full_params` without them. The strategy then runs with missing parameters, likely producing incorrect results or failures. The search-space parameters need default values (e.g., the first option from each list) added to `fixed_params` before clearing `search_space`.
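The fix the review suggests can be sketched as follows. This is a hypothetical helper, not code from the PR: the names `search_space`, `fixed_params`, and `collapse_search_space` mirror the variables discussed in this comment, and the "first option" fallback is the reviewer's proposal.

```python
# Hypothetical sketch of the suggested fix: before clearing the search
# space, fold a deterministic default (the first candidate) for each
# tunable parameter into fixed_params so required keys are still passed.

def collapse_search_space(search_space: dict, fixed_params: dict) -> dict:
    """Return fixed_params extended with the first option of every tunable."""
    collapsed = dict(fixed_params)
    for name, options in search_space.items():
        # Deterministic fallback: take the first candidate value.
        collapsed.setdefault(name, options[0])
    return collapsed

fixed = collapse_search_space({"window": [20, 50, 100]}, {"mode": "fast"})
# fixed == {"mode": "fast", "window": 20}
```

With this in place, the gate would pass `fixed` to `evaluate(...)` instead of an empty dict, so strategies that require constructor arguments from the grid still receive a value.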
Code Review
The pull request introduces a data-sufficiency gate for parameter optimization, preventing overfitting on small datasets. This is a valuable addition for improving the robustness of the backtesting process. The changes are well-contained within src/backtest/runner.py and src/config.py, and the new configuration option param_dof_multiplier is properly integrated. The logging for skipped optimizations is clear and provides useful context. Overall, the changes are positive and enhance the system's reliability.
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 80d78226dc
```python
        strategy=strat.name,
        search_method=search_method,
    )
    search_space = {}
```
Keep tunable params when skipping optimization
When data is below the new threshold, `search_space` is cleared entirely, so `run_all()` later calls `evaluate({})` and drops every parameter that originally had multiple candidate values. For strategies that require those parameters (e.g., external adapters/classes expecting constructor args from the grid), this turns a previously valid run into `generate_signals` failures or silently changes behavior on short datasets. Instead of erasing the space, the gate should collapse each tunable parameter to a deterministic single value (such as the first option) so required keys are still passed.
Summary
Add a data‑sufficiency gate for parameter optimization that scales with degrees of freedom, to avoid overfitting on small samples.
Changes
Added `param_dof_multiplier` config (default `100`) and updated the example config.
Optimization is skipped when `bars < max(2000, multiplier * n_params)`, with a log message.
Breaking changes: None (defaults applied for older configs).
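The skip condition described in the Changes list can be sketched as a small predicate. The default values (`2000`, `100`) come from the PR; the function name `should_optimize` is illustrative, not the actual code in `src/backtest/runner.py`.

```python
# Minimal sketch of the data-sufficiency gate: optimization runs only
# when the sample size covers the degrees of freedom being tuned.

def should_optimize(bars: int, n_params: int,
                    param_min_bars: int = 2000,
                    param_dof_multiplier: int = 100) -> bool:
    """True when there are enough bars to justify a parameter search."""
    required = max(param_min_bars, param_dof_multiplier * n_params)
    return bars >= required

should_optimize(bars=1500, n_params=3)   # False: 1500 < max(2000, 300)
should_optimize(bars=2500, n_params=3)   # True:  2500 >= 2000
should_optimize(bars=2500, n_params=30)  # False: 2500 < 3000
```

Note that the absolute floor of 2000 bars dominates until a strategy exposes more than 20 tunable parameters, after which the per-parameter multiplier takes over.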
How to Test
Checklist (KISS)
- `pre-commit run --all-files` passes
- `.env` values are excluded

Related Issues/Links
Note
Low Risk
Small, localized change that only conditionally disables optimization on small datasets; main risk is unintentionally skipping tuning if the new thresholds are misconfigured.
Overview
Adds a data-sufficiency gate before parameter optimization: when a strategy has a non-empty `search_space` and `bars < max(param_min_bars, param_dof_multiplier * n_params)`, optimization is skipped and the run falls back to evaluating only fixed params (with a structured log explaining why).

Introduces two new config fields, `param_dof_multiplier` (default `100`) and `param_min_bars` (default `2000`), and wires them into `load_config` with backward-compatible defaults for older configs.

Written by Cursor Bugbot for commit 4ac86ed. This will update automatically on new commits.
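The backward-compatible config wiring described above can be sketched as follows. This is an assumption-laden illustration: only the field names `param_dof_multiplier` and `param_min_bars` and their defaults come from the PR; the dataclass name and the shape of `load_config` are hypothetical.

```python
# Hedged sketch of wiring the two new config fields with
# backward-compatible defaults, so older configs that lack the keys
# still load cleanly. OptimizationConfig is an illustrative name.
from dataclasses import dataclass

@dataclass
class OptimizationConfig:
    param_dof_multiplier: int = 100  # bars required per tunable parameter
    param_min_bars: int = 2000       # absolute floor regardless of n_params

def load_config(raw: dict) -> OptimizationConfig:
    """Older configs without the new keys fall back to the defaults."""
    return OptimizationConfig(
        param_dof_multiplier=raw.get("param_dof_multiplier", 100),
        param_min_bars=raw.get("param_min_bars", 2000),
    )

cfg = load_config({})  # legacy config: both defaults apply
# cfg.param_dof_multiplier == 100, cfg.param_min_bars == 2000
```

Using `dict.get` with explicit defaults is one simple way to keep old config files valid; a schema library with default-valued fields would achieve the same effect.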